Principal components analysis of head‐related transfer functions
Authors
Abstract
Similar resources
Persian Handwriting Analysis Using Functional Principal Components
Principal components analysis is a well-known statistical method for dealing with large, dependent data sets. It is also used with functional data, both for data reduction and for representing variation. "Handwriting", meanwhile, is an object studied in various statistical fields such as pattern recognition and shape analysis. Considering time as the argument,...
Functional principal components analysis of workload capacity functions.
Workload capacity, an important concept in many areas of psychology, describes processing efficiency across changes in workload. The capacity coefficient is a function across time that provides a useful measure of this construct. Until now, most analyses of the capacity coefficient have focused on the magnitude of this function, and often only in terms of a qualitative comparison (greater than ...
Representation of Head Related Transfer Functions with Principal Component Analysis
Head Related Transfer Functions (HRTFs) describe the changes in the sound wave as it propagates from a spatial sound source to the human eardrum. One possible representation of HRTF data is the use of Principal Component Analysis (PCA), which decomposes the data into principal components and corresponding weights. We applied PCA to the MIT Media Lab non-individualized HRTF library. The linear amplitudes o...
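To illustrate the decomposition this abstract describes, here is a minimal NumPy sketch that applies PCA to synthetic magnitude spectra standing in for measured HRTFs; the data, dimensions, and variable names are hypothetical and not taken from the MIT library:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical stand-in for measured HRTF magnitude spectra:
# 72 source directions x 128 frequency bins
n_dirs, n_freq = 72, 128
base = np.sin(np.linspace(0, np.pi, n_freq))          # shared spectral shape
spectra = base + 0.1 * rng.normal(size=(n_dirs, n_freq))

mean_spectrum = spectra.mean(axis=0)
Xc = spectra - mean_spectrum                          # center the data

# principal components via SVD of the centered data matrix
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
q = 5
components = Vt[:q]                                   # q spectral basis functions
weights = Xc @ components.T                           # one weight vector per direction

# each spectrum is approximated by the mean plus a weighted sum of components
recon = mean_spectrum + weights @ components
```

Each measured spectrum is thus summarized by q weights instead of 128 frequency bins, which is the data-reduction role of PCA mentioned throughout these abstracts.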
Online Principal Components Analysis
We consider the online version of the well known Principal Component Analysis (PCA) problem. In standard PCA, the input to the problem is a set of d-dimensional vectors X = [x_1, ..., x_n] and a target dimension k < d; the output is a set of k-dimensional vectors Y = [y_1, ..., y_n] that minimize the reconstruction error: min_Φ Σ_i ‖x_i − Φ y_i‖². Here, Φ ∈ R^(d×k) is restricted to being isometric. The...
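The batch version of the objective above has a closed-form solution: an optimal isometric Φ is given by the top-k left singular vectors of X, and the resulting reconstruction error equals the sum of the discarded squared singular values. A short sketch on random data (dimensions and names are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, k = 6, 100, 2
X = rng.normal(size=(d, n))                # columns are the vectors x_i

# batch solution: Phi = top-k left singular vectors (isometric: Phi^T Phi = I)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :k]
Y = Phi.T @ X                              # k-dimensional representations y_i

# reconstruction error sum_i ||x_i - Phi y_i||^2
err = np.sum((X - Phi @ Y) ** 2)
# it matches the energy in the discarded singular values
assert np.isclose(err, np.sum(s[k:] ** 2))
```

The online setting discussed in the abstract is harder because each y_i must be produced before later x_i are seen; the batch computation above only serves as the offline benchmark.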
Principal Components Analysis
Derivation of PCA I: For a set of d-dimensional data vectors {x_i}_{i=1}^n, the principal axes {e_j}_{j=1}^q are those orthonormal axes onto which the retained variance under projection is maximal. It can be shown that the vectors e_j are given by the q dominant eigenvectors of the sample covariance matrix S, such that S e_j = λ_j e_j. The q principal components of the observed vector x_i are given by the vector ...
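The eigenvector characterization in this derivation can be checked directly with NumPy: form the sample covariance matrix S, take its q dominant eigenvectors, and project the centered data onto them (the toy data and sizes below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
# toy data: n = 200 samples of dimension d = 5 (hypothetical)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))

q = 2
Xc = X - X.mean(axis=0)               # center the data
S = np.cov(Xc, rowvar=False)          # sample covariance matrix S

eigvals, eigvecs = np.linalg.eigh(S)  # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]
E = eigvecs[:, order[:q]]             # q dominant eigenvectors e_j: S e_j = λ_j e_j

components = Xc @ E                   # q principal components of each x_i
```

By construction the first component carries at least as much variance as the second, which is exactly the "retained variance is maximal" property the derivation states.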
Journal
Journal title: The Journal of the Acoustical Society of America
Year: 1990
ISSN: 0001-4966
DOI: 10.1121/1.2029241